GRADE ESTIMATION OF KULLBACK-LEIBLER INFORMATION NUMBER

Author

  • J. Mielniczuk
Abstract

An estimator of the Kullback-Leibler information number is introduced using its representation as a functional of the grade density. Its strong consistency is proved under mild conditions on the grade density. The same approach is used to study the entropy measure of bivariate dependence (mutual information). Some applications to detection theory are also given.
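The grade representation referred to above can be made concrete as follows (a sketch of the general idea, not necessarily the paper's exact construction). If Q has distribution function G and X ~ P, the grade density g is the density of the transformed variable G(X) on (0, 1), and the Kullback-Leibler information number can be written as

K(P, Q) = \int_0^1 g(u) \log g(u) \, du.

A plug-in estimator then replaces g with a density estimate built from the transformed sample G(X_1), ..., G(X_n). Below is a minimal Python sketch of such a plug-in estimator, assuming Q is fully known and using a Gaussian kernel estimate; the function name grade_kl_estimate and the kernel choice are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.stats import gaussian_kde

def grade_kl_estimate(x_sample, q_cdf, n_grid=512, eps=1e-12):
    """Plug-in estimate of K(P, Q) via the grade representation
    K(P, Q) = integral_0^1 g(u) log g(u) du, where g is the density
    of Q(X) for X ~ P.  Illustrative sketch only; not Mielniczuk's
    exact estimator or regularity conditions."""
    # Map the P-sample to grades in (0, 1) through the CDF of Q.
    u = q_cdf(np.asarray(x_sample, dtype=float))
    # Kernel estimate of the grade density (boundary bias near 0 and 1
    # is ignored here; a boundary-corrected estimator would do better).
    kde = gaussian_kde(u)
    grid = np.linspace(0.0, 1.0, n_grid)
    g = np.clip(kde(grid), eps, None)
    # Trapezoid approximation of the integral of g log g over (0, 1).
    return np.trapz(g * np.log(g), grid)

# Example: P = N(1, 1), Q = N(0, 1); the true K(P, Q) equals 0.5.
if __name__ == "__main__":
    from scipy.stats import norm
    rng = np.random.default_rng(0)
    x = rng.normal(loc=1.0, scale=1.0, size=5000)
    print(grade_kl_estimate(x, norm.cdf))  # near 0.5, up to kernel bias
```

The same device extends to the mutual-information case mentioned in the abstract: transforming each coordinate of a bivariate sample by its marginal distribution function yields a density on the unit square, and mutual information is the analogous integral of g log g over [0, 1]^2.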


Similar articles

Comparison of Kullback-Leibler, Hellinger and LINEX with Quadratic Loss Function in Bayesian Dynamic Linear Models: Forecasting of Real Price of Oil

In this paper we examine the application of the Kullback-Leibler, Hellinger and LINEX loss functions in a Dynamic Linear Model, using the real price of oil over 106 years of data from 1913 to 2018, with attention to the asymmetry problem in filtering and forecasting. We use the DLM form of the basic Hotelling Model under the quadratic, Kullback-Leibler, Hellinger and LINEX loss functions, trying to address the ...
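For reference, the LINEX loss mentioned above is conventionally written (following Varian and Zellner) as

L(\Delta) = b\,\big(e^{a\Delta} - a\Delta - 1\big), \qquad a \neq 0,\; b > 0,

where \Delta is the estimation or forecast error; the sign of a determines whether over- or under-estimation is penalized more heavily, which is what makes the loss suitable for the asymmetric filtering and forecasting problem described here.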

Full text

Using Kullback-Leibler distance for performance evaluation of search designs

This paper considers the search problem introduced by Srivastava [Sr], which is a model discrimination problem. In the context of search linear models, the discrimination ability of search designs has been studied by several researchers. Some criteria have been developed to measure this capability; however, they are restricted in the sense of being able to work for searching only one possibl...

Full text

Bootstrap Estimate of Kullback-Leibler Information for Model Selection

Estimation of the Kullback-Leibler information is a crucial part of deriving a statistical model selection procedure based on the likelihood principle, such as AIC. To discriminate nested models, we have to estimate it up to the order of a constant, while the Kullback-Leibler information itself is of the order of the number of observations. A correction term employed in AIC is an example to...
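For concreteness, the standard instance of such a correction term is AIC itself: with maximized log-likelihood \ell(\hat\theta) and k free parameters,

\mathrm{AIC} = -2\,\ell(\hat\theta) + 2k,

where the +2k term is the O(1) bias correction that makes -\mathrm{AIC}/2 an approximately unbiased estimate of the expected log-likelihood, i.e. of the Kullback-Leibler term up to the order of a constant, as the snippet above describes.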

Full text

A Geometric Approach to Statistical Estimation

The role of Kerridge inaccuracy, Shannon entropy and Kullback-Leibler distance in statistical estimation is shown for both discrete and continuous observations. The cases of data independence and regression-type dependence are considered in parallel. Pythagorean-like relations valid for probability distributions are presented and their importance for estimation under compressed data is indicated.
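The link between the three quantities named above is the identity (stated here for the discrete case)

I(p, q) = -\sum_x p(x)\,\log q(x) = H(p) + D(p\,\|\,q),

i.e. Kerridge inaccuracy decomposes into Shannon entropy plus the Kullback-Leibler distance, which ties the three measures together in estimation problems of this kind.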

Full text

Model Confidence Set Based on Kullback-Leibler Divergence Distance

Consider the problem of estimating the true density h(·) based upon a random sample X1, ..., Xn. In general, h(·) is approximated using an appropriate (in some sense; see below) model f_θ(x). This article, using Vuong's (1989) test along with a collection of k (> 2) non-nested models, constructs a set of appropriate models, called a model confidence set, for the unknown model h(·). Application of such confide...
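As background on the test used here: for two non-nested candidates f and g, Vuong's (1989) statistic is built from the pointwise log-likelihood ratios,

\mathrm{LR}_n = \sum_{i=1}^n \log \frac{f(X_i; \hat\theta_n)}{g(X_i; \hat\gamma_n)}, \qquad \frac{\mathrm{LR}_n}{\sqrt{n}\,\hat\omega_n} \xrightarrow{d} N(0, 1)

under the null that both models are equally close to h(·) in Kullback-Leibler divergence, where \hat\omega_n^2 is the sample variance of the individual log-ratios. Repeated application of this test across the k candidates can then be used to screen models into a confidence set.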

Full text


Journal:

Volume   Issue 

Pages  -

Publication date: 2008